# Multilingual embedding optimization

## Xlm Roberta Base Uk
License: MIT
A reduced version of the XLM-RoBERTa model, optimized for Ukrainian with partial English support; the parameter count is cut from 470 million to 134 million.
Tags: Large Language Model, Transformers, Other
Author: ukr-models
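
Since the model is distributed in the standard Transformers format, a typical way to use it for multilingual embeddings is to encode sentences and mean-pool the token states. The sketch below assumes the Hugging Face model ID `ukr-models/xlm-roberta-base-uk` and uses mean pooling as one common choice; neither detail is stated on this page.

```python
# Minimal sketch: sentence embeddings with the reduced XLM-RoBERTa model.
# The model ID and the mean-pooling strategy are assumptions for illustration.
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "ukr-models/xlm-roberta-base-uk"  # assumed Hugging Face model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

sentences = [
    "Київ — столиця України.",          # Ukrainian
    "Kyiv is the capital of Ukraine.",  # English
]

with torch.no_grad():
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    outputs = model(**batch)
    # Mean-pool token embeddings, ignoring padding positions.
    mask = batch["attention_mask"].unsqueeze(-1).float()
    embeddings = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)

# Cosine similarity between the Ukrainian and English sentences.
sim = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"cosine similarity: {sim.item():.3f}")
```

Mean pooling over the last hidden states is just one reasonable default for embedding tasks; a task-specific pooling or fine-tuned head may work better depending on the application.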